
    Gateway Modeling and Simulation Plan

    This plan institutes direction across the Gateway Program and the Element Projects to ensure that Cross Program Models and Simulations (M&S) are produced in a manner that (1) generates the artifacts required for NASA-STD-7009 compliance, (2) ensures interoperability of M&S exchanged and integrated across the program, and (3) drives integrated development efforts to provide cross-domain integrated simulation of the Gateway elements, space environment, and operational scenarios. This direction is flowed down to prime contractors via contractual enforcement and includes both the GMS requirements specified in this plan and the NASA-STD-7009-derived requirements necessary for compliance. Grounding principles for the management of Gateway M&S are derived from the Columbia Accident Investigation Board (CAIB) report and the Diaz team report, A Renewed Commitment to Excellence. As an outcome of these reports, and in response to Action 4 of the Diaz team report, the NASA Standard for Models and Simulations, NASA-STD-7009, was developed. The standard establishes M&S requirements for development and use activities to ensure proper capture and communication of M&S pedigree and credibility information to Gateway program decision makers. Over the course of the Gateway program life cycle, M&S will be relied upon heavily to conduct analysis, test products, support operations activities, enable informed decision making, and ultimately to certify the Gateway with an acceptable level of risk to crew and mission. To reduce the risk associated with M&S-influenced decisions, this plan applies the NASA-STD-7009 requirements to produce the artifacts that support credibility assessments and to ensure this information is communicated to program management.

    Stochastic simulation to improve land-cover estimates derived from coarse spatial resolution satellite imagery

    Thesis initially distributed as part of a pilot project of the Presses de l'Université de Montréal / Centre d'édition numérique UdeM (1997-2008), with the author's permission.

    Mechanism of Oxygen Activation in Hydroxylation Reactions Involving Cytochrome P450

    In the 20-37 °C range, the kinetics of cyclohexene epoxidation, naphthalene and cyclohexane hydroxylation, and oxidative demethylation of a group of amines in the presence of rat liver microsomes, NADPH, and O2 have been studied.

    dReDBox: Materializing a full-stack rack-scale system prototype of a next-generation disaggregated datacenter

    Current datacenters are based on server machines, whose mainboard and hardware components form the baseline, monolithic building block that the rest of the system software, middleware, and application stack are built upon. This leads to the following limitations: (a) resource proportionality of a multi-tray system is bounded by the basic building block (mainboard), (b) resource allocation to processes or virtual machines (VMs) is bounded by the resources available within the boundary of the mainboard, leading to spare resource fragmentation and inefficiencies, and (c) upgrades must be applied to each and every server, even when only a specific component needs to be upgraded. The dReDBox project (Disaggregated Recursive Datacentre-in-a-Box) addresses the above limitations and proposes next-generation, low-power, across-form-factor datacenters, departing from the paradigm of the mainboard-as-a-unit and enabling the creation of the function-block-as-a-unit. Hardware-level disaggregation and software-defined wiring of resources are supported by a full-fledged Type-1 hypervisor that can execute commodity virtual machines, which communicate over a low-latency, high-throughput software-defined optical network. To evaluate its novel approach, dReDBox will demonstrate application execution in the domains of network functions virtualization, infrastructure analytics, and real-time video surveillance. This work has been supported in part by the EU H2020 ICT project dReDBox, contract #687632.
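
    The fragmentation limitation (b) is easy to make concrete. The following is a minimal, hypothetical Python sketch (not taken from the dReDBox deliverables; all class names and numbers are invented for illustration) contrasting conventional placement, where a request must fit within one mainboard, with placement against pooled, disaggregated resources:

    from dataclasses import dataclass

    @dataclass
    class Server:
        free_cores: int
        free_mem_gb: int

    def place_on_server(servers, cores, mem_gb):
        """Conventional placement: the whole request must fit on one mainboard."""
        for s in servers:
            if s.free_cores >= cores and s.free_mem_gb >= mem_gb:
                s.free_cores -= cores
                s.free_mem_gb -= mem_gb
                return True
        return False

    def place_disaggregated(servers, cores, mem_gb):
        """Disaggregated placement: draw compute and memory from global pools,
        as if function blocks were wired together over the optical fabric."""
        total_cores = sum(s.free_cores for s in servers)
        total_mem = sum(s.free_mem_gb for s in servers)
        return total_cores >= cores and total_mem >= mem_gb

    # Two half-loaded servers: plenty of spare resources overall, but no
    # single mainboard can host a VM needing 4 cores and 32 GB.
    servers = [Server(free_cores=2, free_mem_gb=64),
               Server(free_cores=6, free_mem_gb=8)]
    print(place_on_server(servers, 4, 32))      # False: request spans mainboards
    print(place_disaggregated(servers, 4, 32))  # True: the global pool satisfies it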

    The effect of chest compression frequency on the quality of resuscitation by lifeguards. A prospective randomized crossover multicenter simulation trial

    BACKGROUND: The ability to perform high-quality cardiopulmonary resuscitation is one of the basic skills for lifeguards. The aim of this study was to assess the influence of chest compression frequency on the quality parameters of chest compressions performed by lifeguards. METHODS: This prospective, randomized, crossover simulation study was performed with 40 lifeguards working in Warsaw, Wroclaw, and Poznan, Poland. The subjects performed 2-min cycles of metronome-guided chest compressions at different rates: 80, 90, 100, 110, 120, 130, 140, and 150 compressions per minute (CPM). RESULTS: The optimal chest compression score, as calculated by the manikin software, was achieved at 110-120 CPM. Median chest compression depth was 53 (interquartile range [IQR] 52-54) mm, 56 (IQR 54-57) mm, 52.5 (IQR 50-54) mm, 53 (IQR 52-53) mm, 50 (IQR 49-51) mm, 47 (IQR 44-51) mm, 41 (IQR 40-42) mm, and 38 (IQR 38-43) mm at 80, 90, 100, 110, 120, 130, 140, and 150 CPM, respectively. The percentage of chest compressions with the correct depth was lower at rates exceeding 120 CPM. CONCLUSIONS: The rate of 100-120 CPM, as recommended by international guidelines, is the optimal chest compression rate for cardiopulmonary resuscitation performed by lifeguards. A rate above 120 CPM was associated with a dramatic decrease in chest compression depth and overall chest compression quality. The role of full chest recoil should be emphasized in basic life support training.
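
    As a quick way to read the numbers above, the short Python sketch below tabulates the reported median depths by rate and flags those falling below a 50 mm floor (the approximate lower bound of the guideline-recommended 50-60 mm depth; the threshold is our reading of the guidelines, not a value stated in the abstract):

    # Median compression depths (mm) by rate (CPM), as reported in the abstract.
    median_depth_mm = {80: 53, 90: 56, 100: 52.5, 110: 53,
                       120: 50, 130: 47, 140: 41, 150: 38}

    for rate in sorted(median_depth_mm):
        depth = median_depth_mm[rate]
        flag = "ok" if depth >= 50 else "below guideline depth"
        print(f"{rate:3d} CPM: {depth:5.1f} mm  {flag}")

    Running it shows every rate above 120 CPM falling below the floor, mirroring the conclusion drawn above.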

    High intensity neutrino oscillation facilities in Europe

    The EUROnu project has studied three possible options for future high-intensity neutrino oscillation facilities in Europe. The first is a Super Beam, in which the neutrinos come from the decay of pions created by bombarding targets with a 4 MW proton beam from the CERN High Power Superconducting Proton Linac. The far detector for this facility is the 500 kt MEMPHYS water Cherenkov detector, located in the Fréjus tunnel. The second facility is the Neutrino Factory, in which the neutrinos come from the decay of μ+ and μ− beams in a storage ring. The far detector in this case is a 100 kt magnetized iron neutrino detector at a baseline of 2000 km. The third option is a Beta Beam, in which the neutrinos come from the decay of beta-emitting isotopes, in particular 6He and 18Ne, also stored in a ring. The far detector is again the MEMPHYS detector in the Fréjus tunnel. EUROnu has undertaken conceptual designs of these facilities and studied the performance of the detectors. Based on this, it has determined the physics reach of each facility, in particular for the measurement of CP violation in the lepton sector, and estimated the cost of construction. These studies have shown that the best facility to build is the Neutrino Factory. However, if a powerful proton driver is constructed for another purpose, or if the MEMPHYS detector is built for astroparticle physics, the Super Beam also becomes very attractive.
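
    As standard background (textbook material, not a result of the EUROnu study itself): the sensitivity of all three facilities rests on the neutrino oscillation probability, which in the two-flavour vacuum approximation reads, in the usual experimentalist's units,

    P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,
        \sin^2\!\left( \frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right)

    CP violation appears only in the full three-flavour appearance probability, through the phase δ_CP of the lepton mixing matrix, which is why an intense beam and a long baseline (such as the 2000 km quoted above) are needed to accumulate enough appearance events to measure it.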

    Tracing the evolution of NGC6397 through the chemical composition of its stellar populations

    With the aim to constrain multiple populations in the metal-poor globular cluster NGC6397, we analyse and discuss the chemical compositions of a large number of elements in 21 red giant branch stars. High-resolution spectra were obtained with the FLAMES/UVES spectrograph on the VLT. We have determined non-LTE abundances of Na and LTE abundances for the remaining 21 elements, including O, Mg, Al, alpha, iron-peak, and neutron-capture elements, many of which have not previously been analysed for this cluster. We have also considered the influence of possible He enrichment in the analysis of the stellar spectra. We find that the Na abundances of evolved as well as unevolved stars show a distinct bimodality, which suggests the presence of two stellar populations: one primordial stellar generation with a composition similar to field stars, and a second generation that is enriched in material processed through hydrogen burning (enriched in Na and Al and depleted in O and Mg). The cluster is dominated (75%) by the second generation. The red giant branch shows a similar bimodal distribution in the Stroemgren colour index c_y = c_1 - (b-y), implying a large difference also in N abundance. The two populations have the same composition in all analysed elements heavier than Al, within the measurement uncertainty of the analysis, with the possible exception of [Y/Fe]. Using two stars with close to identical stellar parameters, one from each generation, we estimate the difference in He content, Delta Y = 0.01 ± 0.06, under the assumption that the mass fraction of iron is the same for the two stars. Finally, we show that winds from fast-rotating massive stars of the first generation can be held responsible for the abundance patterns observed in the second-generation long-lived stars of NGC6397, and estimate that the initial mass of the cluster was at least ten times higher than its present-day value. Comment: 13 pages + appendix with two tables. Accepted for publication in A&A. v2: minor language corrections and Table A.2 corrected.
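
    For readers unfamiliar with the index: in the Stroemgren system the c_1 index is defined as c_1 = (u-v) - (v-b) (a standard photometric definition, not specific to this paper), so the quoted index expands to

    c_y = c_1 - (b-y) = (u-v) - (v-b) - (b-y)

    Subtracting (b-y) removes much of the temperature sensitivity of c_1, leaving c_y sensitive to the strong NH absorption falling in the u band, and hence to the N abundance; this is why a bimodal c_y distribution along the red giant branch implies a bimodal N distribution.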

    Coupling early warning services, crowdsourcing, and modelling for improved decision support and wildfire emergency management

    The threat of forest fire disasters is increasing around the globe as the human footprint continues to encroach on natural areas and climate change raises the potential for extreme weather. It is essential that emergency managers and responders have the tools to educate, prepare, monitor, react to, and fight natural fire disasters, and thereby reduce overall disaster effects. In the context of the I-REACT project, such a big-data crisis system is being developed, based on the integration of information from different sources, automated data-processing chains, and decision support systems. This paper presents the wildfire monitoring and emergency management system developed for European forest fire disasters, serving those involved in and affected by wildfires.
